Attentive Tensor Product Learning for Language Generation and Grammar Parsing
Authors
Abstract
This paper proposes a new architecture, Attentive Tensor Product Learning (ATPL), for representing grammatical structures in deep learning models. ATPL bridges the gap between deep learning and explicit language structures and rules by exploiting Tensor Product Representations (TPR), a structured neural-symbolic model developed in cognitive science. The key ideas of ATPL are: 1) unsupervised learning of the role-unbinding vectors of words via a TPR-based deep neural network; 2) the use of attention modules to compute the TPR; and 3) the integration of TPR with standard deep learning architectures, including Long Short-Term Memory (LSTM) networks and feedforward neural networks (FFNN). The novelty of our approach lies in its ability to extract the grammatical structure of a sentence using role-unbinding vectors obtained in an unsupervised manner. ATPL is applied to 1) image captioning, 2) part-of-speech (POS) tagging, and 3) constituency parsing of sentences. Experimental results demonstrate the effectiveness of the proposed approach.
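To make the TPR machinery behind idea 1) concrete, the sketch below binds filler (word) vectors to role vectors with outer products and then recovers each filler with an unbinding vector. It is a minimal NumPy illustration under assumed dimensions and random vectors, not the learned components of ATPL itself.

    # Minimal sketch of TPR binding/unbinding (illustrative dimensions and
    # random vectors; ATPL learns its role-unbinding vectors instead).
    import numpy as np

    rng = np.random.default_rng(0)
    n_words, d_filler, d_role = 5, 8, 8

    F = rng.standard_normal((n_words, d_filler))  # filler vectors (word content)
    R = rng.standard_normal((n_words, d_role))    # role vectors (structural positions)

    # Binding: the sentence is encoded as T = sum_i f_i (outer product) r_i.
    T = sum(np.outer(F[i], R[i]) for i in range(n_words))

    # Unbinding: with unbinding vectors dual to the roles (columns of pinv(R)),
    # each filler is recovered as f_i ~= T u_i.
    U = np.linalg.pinv(R)
    for i in range(n_words):
        f_hat = T @ U[:, i]
        print(i, np.allclose(f_hat, F[i]))

In ATPL, the role-unbinding vectors that play the part of U here are produced by the network in an unsupervised manner rather than by a pseudoinverse, which is what allows the grammatical structure of a sentence to be read off from them.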
Similar Resources
Inductive vs. Deductive Grammar Instruction and the Grammatical Performance of EFL Learners
Learning a foreign language poses a great challenge to students, since it involves learning different skills and subskills. Quite a few studies have so far examined the relationship between gender and learning a foreign language. On the other hand, two major approaches to teaching grammar have been offered by language experts: inductive and deductive. The present study examines...
Learning unification-based natural language grammars
Practical text processing systems need wide-coverage grammars. When parsing unrestricted language, such grammars often fail to generate all of the sentences that humans would judge to be grammatical. This problem undermines successful parsing of the text and is known as undergeneration. There are two main ways of dealing with undergeneration: either by sentence correction, or by grammar correction...
Adcs 2010
Wikimedia article archives (Wikipedia, Wiktionary, and so on) assemble open-access, authoritative corpora for semantically informed data mining, machine learning, information retrieval, and natural language processing. In this paper, we show the MediaWiki wikitext grammar to be context-sensitive, thus precluding the application of simple parsing techniques. We show there exists a worst-case bound on time...
The Impact of Structured Input-based Tasks on L2 Learners’ Grammar Learning
Task-based language teaching has received increased attention in second language research. However, the combination of a structured input-based approach and task-based language teaching has not been examined in relation to L2 grammar learning. To address this gap, the present study investigated how structured input-based tasks with and without explicit information impacted learners’ ...
Journal: CoRR
Volume: abs/1802.07089
Pages: -
Publication date: 2018